Sparse Online Variational Bayesian Regression
Authors
Abstract
This work considers variational Bayesian inference as an inexpensive and scalable alternative to a fully Bayesian approach in the context of sparsity-promoting priors. In particular, the priors considered arise from scale mixtures of Normal distributions with a generalized inverse Gaussian mixing distribution. This includes the Bayesian LASSO introduced in [65], and also a family of priors which more strongly promote sparsity. For linear models the method requires only the iterative solution of deterministic least squares problems. Furthermore, for $p$ unknown covariates the method can be implemented exactly online with a cost of $O(p^3)$ computation and $O(p^2)$ memory per iteration -- in other words, the cost is independent of $n$, and in principle infinite data can be considered. For large $p$, an approximation is able to achieve promising results for a cost of $O(p)$ per iteration, in both computation and memory. Strategies for hyper-parameter tuning are also considered. The method is implemented for real and simulated data. It is shown that the performance in terms of variable selection and uncertainty quantification is comparable, on problems where that method is tractable, to a fully Bayesian method, at a fraction of the cost. The present method comfortably handles $n = 65536$, $p = 131073$ on a laptop in less than 30 minutes, and $n = 10^5$, $p = 2.1 \times 10^6$ overnight.
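The online claim in the abstract rests on a standard structural fact: for a linear model, the least squares problem depends on the data only through the $p \times p$ and $p$-dimensional sufficient statistics, which can be accumulated one observation at a time. The sketch below illustrates only that structural point, not the paper's actual variational updates; the `weights` argument is a hypothetical placeholder for the prior-induced diagonal reweighting a variational iteration would supply.

```python
import numpy as np

def make_state(p):
    """Initialize accumulated statistics for p covariates: O(p^2) memory total."""
    return {"XtX": np.zeros((p, p)), "Xty": np.zeros(p)}

def online_update(state, x, y):
    """Absorb one observation (x, y): O(p^2) work, no raw data stored."""
    state["XtX"] += np.outer(x, x)
    state["Xty"] += y * x
    return state

def solve(state, weights=None, ridge=1e-8):
    """One deterministic least squares solve, O(p^3), independent of n.
    `weights` stands in for a prior-induced diagonal term (hypothetical here)."""
    p = state["Xty"].shape[0]
    D = np.diag(weights if weights is not None else np.full(p, ridge))
    return np.linalg.solve(state["XtX"] + D, state["Xty"])

rng = np.random.default_rng(0)
p = 5
beta_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])  # sparse truth
state = make_state(p)
for _ in range(1000):  # stream n = 1000 observations; memory stays O(p^2)
    x = rng.standard_normal(p)
    y = x @ beta_true + 0.1 * rng.standard_normal()
    state = online_update(state, x, y)
beta_hat = solve(state)
```

Because only `XtX` and `Xty` are retained, the stream could in principle run forever at constant memory, which is what "infinite data can be considered" refers to.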
Similar resources
Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression
This paper presents a novel variational inference framework for deriving a family of Bayesian sparse Gaussian process regression (SGPR) models whose approximations are variationally optimal with respect to the full-rank GPR model enriched with various corresponding correlation structures of the observation noises. Our variational Bayesian SGPR (VBSGPR) models jointly treat both the distribution...
Online Sparse Linear Regression
We consider the online sparse linear regression problem, which is the problem of sequentially making predictions while observing only a limited number of features in each round, to minimize regret with respect to the best sparse linear regressor, where prediction accuracy is measured by square loss. We give an inefficient algorithm that obtains regret bounded by Õ(√T) after T prediction rounds. We...
Sparse Bayesian kernel logistic regression
In this paper we present a simple hierarchical Bayesian treatment of the sparse kernel logistic regression (KLR) model based on MacKay's evidence approximation. The model is re-parameterised such that an isotropic Gaussian prior over parameters in the kernel induced feature space is replaced by an isotropic Gaussian prior over the transformed parameters, facilitating a Bayesian analysis using stan...
Incremental Variational Sparse Gaussian Process Regression
Recent work on scaling up Gaussian process regression (GPR) to large datasets has primarily focused on sparse GPR, which leverages a small set of basis functions to approximate the full Gaussian process during inference. However, the majority of these approaches are batch methods that operate on the entire training dataset at once, precluding the use of datasets that are streaming or too large ...
Online Bayesian Collaborative Topic Regression
Collaborative Topic Regression (CTR) combines ideas of probabilistic matrix factorization (PMF) and topic modeling (e.g., LDA) for recommender systems, which has gained increasing successes in many applications. Despite enjoying many advantages, the existing CTR algorithms have some critical limitations. First of all, they are often designed to work in a batch learning manner, making them unsui...
Journal
Journal title: SIAM/ASA Journal on Uncertainty Quantification
Year: 2022
ISSN: 2166-2525
DOI: https://doi.org/10.1137/21m1401188